Approximate tensor decompositions: Disappearance of many separations


Abstract

It is well known that tensor decompositions show separations, that is, that constraints on the local terms (such as positivity) may entail an arbitrarily high cost in their representation. Here we show that many of these separations disappear in the approximate case. Specifically, for every approximation error $\varepsilon$ and norm, we define the approximate rank as the minimum rank of an element in the $\varepsilon$-ball with respect to that norm. For positive semidefinite matrices, we show that the separations between the rank, the purification rank, and the separable rank disappear for a large class of Schatten $p$-norms. For nonnegative tensors, the separations disappear for all $\ell_p$-norms with $p > 1$. For the trace norm ($p = 1$), we obtain upper bounds that depend on the ambient dimension. We also provide a deterministic algorithm for a decomposition attaining our bounds. Our main tool is an approximate version of Carathéodory's Theorem. Our results imply that many separations are not robust under small perturbations of the tensor, with implications for quantum many-body systems and communication complexity.
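For matrices, the notion of approximate rank in the abstract can be illustrated concretely: by Mirsky's theorem, truncating the SVD gives the best rank-$k$ approximation in every Schatten $p$-norm, so the approximate rank in these norms can be read off the singular-value tail. A minimal sketch, assuming this matrix setting (the function name and the test matrix are illustrative, not from the paper):

```python
import numpy as np

def approximate_rank(A, eps, p=2):
    """Smallest rank within an eps-ball of A in Schatten p-norm.

    By Mirsky's theorem, truncating the SVD gives the best rank-k
    approximation in every Schatten p-norm, so it suffices to scan
    the tail of the singular values.
    """
    s = np.linalg.svd(A, compute_uv=False)
    for k in range(len(s) + 1):
        tail = s[k:]
        if p == np.inf:
            err = tail.max() if tail.size else 0.0
        else:
            err = float((tail ** p).sum() ** (1.0 / p))
        if err <= eps:
            return k
    return len(s)

# A rank-3 PSD matrix plus tiny noise: the exact rank jumps up,
# but the approximate rank stays 3 for any reasonable eps.
rng = np.random.default_rng(0)
B = rng.standard_normal((8, 3))
A = B @ B.T + 1e-9 * rng.standard_normal((8, 8))
print(np.linalg.matrix_rank(A), approximate_rank(A, eps=1e-6))
```

This mirrors the paper's theme in miniature: the exact rank is not robust under small perturbations, while the $\varepsilon$-ball rank is.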


Related articles

Approximate Inference in Graphical Models using Tensor Decompositions

We demonstrate that tensor decompositions can be used to transform graphical models into structurally simpler graphical models that approximate the same joint probability distribution. In this way, standard inference algorithms, such as the junction tree algorithm, can be applied to the transformed graphical model for approximate inference. The usefulness of the technique is demonstrat...
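The tensor view of graphical models underlying this abstract can be sketched on a toy example (the factors `f` and `g` are made up for illustration, not taken from the paper): a chain model $p(a,b,c) \propto f(a,b)\,g(b,c)$ is a contraction of factor tensors, and marginalization is a further contraction over axes.

```python
import numpy as np

# Hypothetical three-variable chain p(a, b, c) ∝ f(a, b) * g(b, c):
# the joint distribution is a contraction of factor tensors, and
# marginalization is just a further contraction over axes.
rng = np.random.default_rng(2)
f = rng.random((2, 3))   # factor over (a, b)
g = rng.random((3, 2))   # factor over (b, c)

joint = np.einsum('ab,bc->abc', f, g)   # unnormalized joint tensor
joint /= joint.sum()                    # normalize to a distribution
p_a = joint.sum(axis=(1, 2))            # marginal p(a)
print(joint.shape, float(p_a.sum()))    # shape (2, 3, 2); sums to 1
```

Structurally simpler models correspond to low-rank factorizations of such tensors, which is what makes standard inference cheaper after the transformation.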


Canonical Tensor Decompositions

The Singular Value Decomposition (SVD) may be extended to tensors in at least two very different ways. One is the High-Order SVD (HOSVD), and the other is the Canonical Decomposition (CanD). Only the latter is closely related to the tensor rank. Important basic questions are raised in this short paper, such as the maximal achievable rank of a tensor of given dimensions, or the computation of a ...


Orthogonal Tensor Decompositions

We explore the orthogonal decomposition of tensors (also known as multidimensional arrays or n-way arrays) using two different definitions of orthogonality. We present numerous examples to illustrate the difficulties in understanding such decompositions. We conclude with a counterexample to a tensor extension of the Eckart-Young SVD approximation theorem by Leibovici and Sabatier [Linear Algebr...


Tensor Decompositions and Applications

This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, grap...


Regularized Computation of Approximate Pseudoinverse of Large Matrices Using Low-Rank Tensor Train Decompositions

We propose a new method for low-rank approximation of Moore-Penrose pseudoinverses (MPPs) of large-scale matrices using tensor networks. The computed pseudoinverses can be useful for solving or preconditioning large-scale overdetermined or underdetermined systems of linear equations. The computation is performed efficiently and stably based on the modified alternating least squares (MALS) schem...
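For context, the object that the MALS tensor-network scheme approximates at scale is the ordinary Moore-Penrose pseudoinverse. A small dense sketch of that object and its role in least-squares solving (the matrix here is synthetic; this is not the tensor-train method itself):

```python
import numpy as np

# The object the MALS scheme approximates at scale is the ordinary
# Moore-Penrose pseudoinverse; here it is computed densely for a
# small synthetic overdetermined system as a point of reference.
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))   # 10 equations, 4 unknowns
b = rng.standard_normal(10)

x_pinv = np.linalg.pinv(A) @ b                    # pseudoinverse solution
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares solution
print(np.allclose(x_pinv, x_lstsq))               # both give the same solution
```

Dense computation like this is infeasible for the large-scale matrices the paper targets, which is the motivation for the low-rank tensor-train representation.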



Journal

Journal title: Journal of Mathematical Physics

Year: 2021

ISSN: 0022-2488, 1527-2427, 1089-7658

DOI: https://doi.org/10.1063/5.0033876